General Comment: We thank all the reviewers for providing comments that have been helpful for us in reassessing our work.

Neural Information Processing Systems

In fact, a single DeepGambler model, trained once, can outperform SN trained for different coverages. That said, some qualitative comparisons are available. Also, we have provided more comments on the similarities and differences between the SR and the PM in Section 11.3. Yes, it would have been better if we had been clearer about the meaning of "uncertainty"; we will use "confidence score" when referring to this quantity in the revision.
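To make the coverage point concrete, here is a minimal sketch (hypothetical names; the confidence scores stand in for the abstention/confidence output of a trained model, and the thresholding rule is our assumption rather than the paper's exact evaluation protocol): a single trained model is swept over coverage levels simply by keeping its most confident predictions, so no per-coverage retraining is needed.

```python
import numpy as np

def risk_coverage_points(confidence, correct, coverages=(1.0, 0.9, 0.8, 0.7)):
    """For each target coverage, keep the most confident fraction of examples
    and report the error rate (selective risk) on that subset.
    A single trained model is evaluated at every coverage level."""
    order = np.argsort(-confidence)          # most confident first
    correct = np.asarray(correct)[order]
    n = len(correct)
    points = []
    for c in coverages:
        k = max(1, int(round(c * n)))        # number of examples kept
        risk = 1.0 - correct[:k].mean()      # error on the kept subset
        points.append((c, risk))
    return points

# Toy usage with made-up confidence scores and correctness labels.
rng = np.random.default_rng(0)
conf = rng.random(1000)
corr = rng.random(1000) < (0.6 + 0.4 * conf)  # higher confidence -> more often correct
for cov, risk in risk_coverage_points(conf, corr):
    print(f"coverage={cov:.1f}  selective risk={risk:.3f}")
```

Every point on the resulting risk-coverage curve comes from the same trained model, whereas SN requires a separate model per target coverage.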




General comments: We thank all the reviewers for their insightful and unanimously positive comments.

Neural Information Processing Systems

Our novelty has also been affirmed by R1, R2, and R4. However, we should clarify that (1) our work differs completely from MMD-GANs, and (2) although Ref. [4] … Our supplementary material includes the state of the art. Below we discuss the reviewers' comments; we will address all of them in the revision. The Lipschitz constraint is not a necessity in our RCF-GAN; please refer to our proof. Fig. 4 in the paper shows the image reconstruction and interpolation, validating our superior performance. We will elaborate more upon this in the revision.
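As background on why a bounded loss can remove the need for a Lipschitz penalty, the sketch below is our simplified illustration, not the RCF-GAN objective itself; the frequency-sampling scheme and the squared-difference weighting are assumptions. It compares the empirical characteristic functions (CFs) of two samples at randomly drawn frequencies; since a CF has modulus at most one, the discrepancy is bounded without constraining the networks.

```python
import numpy as np

def empirical_cf(x, t):
    """Empirical characteristic function E[exp(i t^T x)] of a sample x with
    shape (n, d), evaluated at frequencies t with shape (m, d).
    Returns an (m,) complex array."""
    return np.exp(1j * x @ t.T).mean(axis=0)

def cf_discrepancy(x_real, x_fake, num_freq=64, scale=1.0, seed=0):
    """Mean squared difference between the two empirical CFs at Gaussian-sampled
    frequencies. Each CF value has modulus <= 1, so the loss is bounded
    without imposing any Lipschitz constraint."""
    rng = np.random.default_rng(seed)
    d = x_real.shape[1]
    t = rng.normal(scale=scale, size=(num_freq, d))
    diff = empirical_cf(x_real, t) - empirical_cf(x_fake, t)
    return np.mean(np.abs(diff) ** 2)

# Toy usage: two Gaussian samples with different means.
rng = np.random.default_rng(1)
print(cf_discrepancy(rng.normal(0.0, 1.0, (512, 8)),
                     rng.normal(0.5, 1.0, (512, 8))))
```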



… not obvious a priori (and indeed, as we explain in Section 2.2, the previously used objective (4) also has a natural …)

Neural Information Processing Systems

We thank the reviewers for their comments. Before we address individual concerns, we make some general remarks. This new insight provides an alternative, and much simpler, proof of Cohen et al.'s key result. It also opens up another avenue for future work, namely finding better nonlinear Lipschitz guarantees. As mentioned above, the Expectation over Transformation attack of Athalye et al. has the opposite order of log and expectation; we will add this citation and a discussion of this interesting connection. Thus, a high abstention rate leads to lower certified accuracy. Nevertheless, we will add this experiment to the final version of the paper.
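To illustrate how abstention interacts with certified accuracy, here is a minimal sketch in the spirit of Cohen et al.'s PREDICT procedure (the base classifier f and all parameter values are placeholders): the smoothed classifier abstains whenever a binomial test cannot distinguish the top class from the runner-up, and because abstentions are scored as errors, a higher abstention rate directly lowers certified accuracy.

```python
import numpy as np
from scipy.stats import binomtest

def smoothed_predict(f, x, sigma, n=100, alpha=0.001, num_classes=10, rng=None):
    """Predict with the smoothed classifier g(x) = argmax_c P(f(x + noise) = c),
    in the spirit of Cohen et al.'s PREDICT: abstain (return -1) when the top
    class does not beat the runner-up at significance level alpha."""
    rng = rng or np.random.default_rng()
    counts = np.zeros(num_classes, dtype=int)
    for _ in range(n):
        counts[f(x + rng.normal(0.0, sigma, size=x.shape))] += 1
    top2 = counts.argsort()[-2:][::-1]             # top class and runner-up
    n_a, n_b = counts[top2[0]], counts[top2[1]]
    if binomtest(n_a, n_a + n_b, 0.5).pvalue > alpha:
        return -1                                   # abstain
    return int(top2[0])

# Toy base classifier: thresholds the first coordinate (placeholder for a network).
f = lambda z: int(z[0] > 0)
pred = smoothed_predict(f, np.array([0.05, 0.0]), sigma=0.5, num_classes=2,
                        rng=np.random.default_rng(0))
print("prediction:", pred, "(-1 means abstain; abstentions count as errors "
      "when computing certified accuracy)")
```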